A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
Authors
Abstract
For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution-free. A quadratic-time algorithm is described for computing the bound and its corresponding class-conditional distribution functions. We compare our approach to existing techniques and show the superiority of our bound to a method inspired by Fano's inequality in which the continuous random variable is discretized.
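The paper's quadratic-time algorithm is not reproduced in this listing, but its central building block, the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality, is easy to sketch. The DKW inequality states that the empirical CDF F_n deviates from the true CDF F by more than eps (uniformly over x) with probability at most 2*exp(-2*n*eps^2), which yields a distribution-free confidence band around each class-conditional empirical CDF. The following is a minimal illustrative sketch, not the authors' implementation; the function names are hypothetical.

```python
import numpy as np

def dkw_epsilon(n, alpha):
    """Half-width of the DKW confidence band at level 1 - alpha.

    Solves 2 * exp(-2 * n * eps**2) = alpha for eps, so that
    P(sup_x |F_n(x) - F(x)| > eps) <= alpha.
    """
    return np.sqrt(np.log(2.0 / alpha) / (2.0 * n))

def dkw_band(samples, alpha=0.05):
    """Distribution-free confidence band for the CDF of one class.

    Returns the sorted sample points together with lower and upper
    envelopes that contain the true CDF with probability >= 1 - alpha.
    """
    x = np.sort(np.asarray(samples, dtype=float))
    n = x.size
    ecdf = np.arange(1, n + 1) / n          # empirical CDF at sample points
    eps = dkw_epsilon(n, alpha)
    lower = np.clip(ecdf - eps, 0.0, 1.0)   # CDF values must stay in [0, 1]
    upper = np.clip(ecdf + eps, 0.0, 1.0)
    return x, lower, upper
```

In the paper's setting, one such band is computed per input symbol; a probabilistic lower bound on the mutual information is then obtained by optimizing over pairs of class-conditional distribution functions consistent with both bands, which is where the quadratic-time algorithm comes in.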
Similar resources
A Tight Lower Bound on the Mutual Information of a Binary and an Arbitrary Finite Random Variable in Dependence of the Variational Distance
"THIS PAPER IS ELIGIBLE FOR THE STUDENT PAPER AWARD." In this paper a numerical method is presented that finds a lower bound on the mutual information between a binary and an arbitrary finite random variable, over all joint distributions whose variational distance to a known joint distribution does not exceed a known value. This lower bound can be applied to mutual information estimation w...
Generalized EXIT chart and BER analysis of finite-length turbo codes
We propose an analysis tool of the finite-length iterative turbo decoding algorithm. The proposed tool is a generalized EXIT chart based on the mutual information transfer characteristics of the extrinsic information in the iterative turbo decoding algorithm. The proposed tool can describe the probabilistic convergence behavior of the iterative decoding algorithm. By using this tool, we obtain ...
A Multi-Period 1-Center Location Problem in the Presence of a Probabilistic Line Barrier
This paper investigates a multi-period rectilinear distance 1-center location problem considering a line-shaped barrier, in which the starting point of the barrier follows the uniform distribution function. In addition, the existing points are sensitive to demands and locations. The purpose of the presented model is to minimize the maximum barrier distance from the new facility to the existing ...
Confidence Intervals for the Mutual Information
"THIS PAPER IS ELIGIBLE FOR THE STUDENT PAPER AWARD." By combining a bound on the absolute difference in mutual information between two joint probability distributions at a fixed variational distance with a bound on the probability of a maximal deviation in variational distance between a true joint probability distribution and an empirical joint probability distribution, confidenc...
Order Statistics and Probabilistic Robust Control
Order statistics theory is applied in this paper to probabilistic robust control theory to compute the minimum sample size needed to come up with a reliable estimate of an uncertain quantity under continuity assumption of the related probability distribution. Also, the concept of distribution-free tolerance intervals is applied to estimate the range of an uncertain quantity and extract the info...
Journal: Neural Computation
Volume 23, Issue 7
Pages: -
Publication year: 2011